    Multivariate sensitivity analysis for a large-scale climate impact and adaptation model

    We develop a new, efficient methodology for Bayesian global sensitivity analysis of large-scale multivariate data. The focus is on computationally demanding models with correlated variables. A multivariate Gaussian process is used as a surrogate model to replace the expensive computer model. To improve computational efficiency and performance, compactly supported correlation functions are used. The goal is to generate sparse matrices, which offer crucial advantages when dealing with large datasets; cross-validation is used to determine the optimal degree of sparsity. This method is combined with a robust adaptive Metropolis algorithm and a parallel implementation to speed up convergence to the target distribution. The method was applied to a multivariate dataset from the IMPRESSIONS Integrated Assessment Platform (IAP2), an extension of the CLIMSAVE IAP, which has been widely applied in climate change impact, adaptation and vulnerability assessments. Our empirical results on synthetic and IAP2 data show that the proposed methods are efficient and accurate for global sensitivity analysis of complex models.
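As a rough illustration of how a compactly supported correlation function produces sparse matrices, the sketch below evaluates a Wendland C² kernel on a hypothetical 1-D design: distances beyond the range parameter yield exact zeros, so the correlation matrix can be stored in sparse form. The kernel choice, range value, and design are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np
from scipy.sparse import csr_matrix

def wendland_c2(d, theta):
    """Wendland C^2 compactly supported correlation: exactly zero beyond range theta."""
    r = d / theta
    return np.where(r < 1.0, (1.0 - r) ** 4 * (4.0 * r + 1.0), 0.0)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 200))    # hypothetical 1-D design points
d = np.abs(x[:, None] - x[None, :])         # pairwise distances
R = wendland_c2(d, theta=1.5)               # correlation matrix, mostly exact zeros
R_sparse = csr_matrix(R)                    # exact zeros -> compact sparse storage

sparsity = 1.0 - R_sparse.nnz / R.size
print(f"fraction of zero entries: {sparsity:.2f}")
```

Cross-validation over the range parameter (here `theta`) would then select the degree of sparsity, trading storage and speed against predictive accuracy.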

    Agricultural productivity in past societies: toward an empirically informed model for testing cultural evolutionary hypotheses

    Agricultural productivity, and its variation in space and time, plays a fundamental role in many theories of human social evolution. However, we often lack systematic information about the productivity of past agricultural systems on a scale large enough to test these theories properly. The effect of climate on crop yields has received a great deal of attention, resulting in a range of empirical and process-based models, yet the focus has primarily been on current or future conditions. In this paper, we argue for a “bottom-up” approach that estimates potential productivity based on information about the agricultural practices and technologies used in past societies. Of key theoretical interest is using this information to estimate carrying capacity. This requires high-quality historical and archaeological information about past societies in order to infer the temporal and geographic patterns of change in agricultural productivity and potential. We discuss the information we need to collect about past agricultural techniques and practices, and introduce a new databank initiative that we have developed for collating the best available historical and archaeological evidence. A key benefit of our approach lies in making explicit the steps in the estimation of past productivities and carrying capacities, and in being able to assess the effects of different modelling assumptions. This is undoubtedly an ambitious task, yet it promises to provide important insights into fundamental aspects of past societies, enabling us to test more rigorously key hypotheses about human socio-cultural evolution.

    Emulating global climate change impacts on crop yields

    The potential effects of climate change on the environment and society are many. To quantify the uncertainty associated with these effects, highly complex simulation models are run with detailed representations of ecosystem processes. These models are computationally expensive, and a single run can take several days. Computationally cheaper models can be obtained from large ensembles of simulations using statistical emulation. The purpose of this paper is to construct a cheaper computational model (emulator) from simulations of the Lund-Potsdam-Jena managed Land (LPJmL) model, a dynamic global vegetation and crop model. This paper focuses on statistical emulation of potential crop yields from LPJmL; an emulator is constructed using a combination of ordinary least squares, principal component analysis and weighted least squares methods. For five climate models, under cross-validation the percentage of variance explained ranges from 60–88% for the rainfed crops and 62–93% for the irrigated crops. The emulator can be used to predict potential crop yield change under any future climate scenario and management options.
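A minimal sketch of the core emulation idea, combining principal component analysis of the high-dimensional outputs with ordinary least squares regression of the component scores on the inputs. The toy "simulator", dimensions, and number of retained components are assumptions for illustration only, not the LPJmL configuration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, q = 300, 4, 50                     # runs, input parameters, output dimensions
X = rng.uniform(-1, 1, (n, p))           # hypothetical climate/management inputs
# toy stand-in for an expensive simulator: smooth multivariate output plus noise
Y = np.sin(X @ rng.normal(size=(p, q))) + 0.05 * rng.normal(size=(n, q))

# PCA on the outputs: centre, then keep the k leading components
Ym = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - Ym, full_matrices=False)
k = 5
scores = U[:, :k] * s[:k]                # per-run principal component scores

# ordinary least squares from inputs (plus intercept) to each PC score
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, scores, rcond=None)

# emulate: predict scores for the inputs, then map back through the PC basis
Y_hat = Ym + (A @ beta) @ Vt[:k]
r2 = 1 - ((Y - Y_hat) ** 2).sum() / ((Y - Ym) ** 2).sum()
print(f"variance explained: {r2:.2f}")
```

In practice the regression step would use the weighted least squares and cross-validation described in the abstract; the sketch keeps only the PCA-plus-regression skeleton.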

    The Role of Digital Technologies in Responding to the Grand Challenges of the Natural Environment: The Windermere Accord

    Digital technology is having a major impact on many areas of society, and there is equal opportunity for impact on science. This is particularly true in the environmental sciences as we seek to understand the complexities of the natural environment under climate change. This perspective presents the outcomes of a summit in this area, a unique cross-disciplinary gathering bringing together environmental scientists, data scientists, computer scientists, social scientists, and representatives of the creative arts. The key output of this workshop is an agreed vision in the form of a framework and associated roadmap, captured in the Windermere Accord. This accord envisions a new kind of environmental science underpinned by unprecedented amounts of data, with technological advances leading to breakthroughs in taming uncertainty and complexity, and also supporting openness, transparency, and reproducibility in science. The perspective also includes a call to build an international community working in this important area.

    A surrogate-based approach to modelling the impact of hydrodynamic shear stress on biofilm deformation

    The aim is to investigate the feasibility of using a surrogate-based method to emulate the deformation and detachment behaviour of a biofilm in response to hydrodynamic shear stress. The influence of shear force and growth rate parameters on the patterns of growth, structure and resulting shape of microbial biofilms was examined. We develop a novel statistical modelling approach to this problem, using a combination of Bayesian Poisson regression and dynamic linear models for the emulation. We observe that the hydrodynamic shear force affects biofilm deformation in a manner consistent with previous literature. Sensitivity results also showed that the shear flow and the yield coefficient for heterotrophic bacteria are the two principal mechanisms governing bacterial detachment. The sensitivity of the model parameters is temporally dynamic, emphasising the importance of conducting the sensitivity analysis across multiple time points. The surrogate models are shown to perform well, producing a ~480-fold increase in computational efficiency. We conclude that a surrogate-based approach is effective, and that the resulting biofilm structure is determined primarily by a balance between bacterial growth and applied shear stress.
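To illustrate the Poisson-regression component of such an emulator, the sketch below fits a plain maximum-likelihood Poisson regression by iteratively reweighted least squares (IRLS) on synthetic counts. The data, dimensions, and the non-Bayesian point-estimate treatment are simplifying assumptions; the paper's approach is Bayesian and is coupled with dynamic linear models to capture temporal dynamics.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 400, 3
X = np.column_stack([np.ones(n), rng.uniform(-1, 1, (n, p))])  # intercept + inputs
beta_true = np.array([1.0, 0.8, -0.5, 0.3])                    # illustrative coefficients
y = rng.poisson(np.exp(X @ beta_true))   # hypothetical count output (e.g. detached cells)

# IRLS for Poisson regression with log link; a Bayesian treatment would
# instead place priors on beta and sample its posterior
beta = np.zeros(X.shape[1])
for _ in range(25):
    mu = np.exp(X @ beta)
    W = mu                                   # working weights (Poisson variance = mean)
    z = X @ beta + (y - mu) / mu             # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

print(np.round(beta, 2))
```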

    Statistical emulation as a tool for analysing complex multiscale stochastic biological model outputs

    The performance of credible simulations in open engineered biological frameworks is an important step in applying scientific knowledge to solve real-world problems and in enhancing our ability to make novel discoveries. Maximising our potential to explore the range of solutions at the frontier could therefore reduce the risk of failure on a large scale. One primary application of this type of knowledge is in the management of wastewater treatment systems. Efficient optimisation of wastewater treatment plants focuses on aggregate outcomes of individual particle-level processes. One of the crucial aspects of an engineering biology approach to wastewater treatment is running highly complex simulations of biological particles. This type of model can scale from one level to another and can also be used to study how to manage real systems effectively with minimal physical experimentation. To identify crucial features and model water treatment plants at large scale, we need to understand the interactions of microbes at fine resolution, using models that provide the best available representation of micro-scale responses. The challenge then becomes how to transfer this small-scale information to the macro-scale process in a computationally efficient and sufficiently accurate way. It has been established that the macro-scale characteristics of wastewater treatment plants are the consequence of the micro-scale features of a vast number of individual particles that make up the bacterial community (Ofiteru et al. 2014). Nevertheless, simulation of open biological systems is challenging because they are physically complex and often involve a large number of bacteria, on the order of 10¹² to 10¹⁸ individual particles. The models are computationally expensive and, due to computing constraints, only limited sets of scenarios are often possible.
A simplified approach to this problem is to use a statistical approximation of simulation ensembles derived from the complex models, which helps reduce the computational burden. Our aim is to build a cheaper surrogate of the Individual-based (IB) model simulation of biological particles. The main issue we address is the strategy for emulating high-level summaries from the IB model simulation data. Our approach is to condense the massive, long time-series outputs of particles of various species by spatial aggregation, producing the most relevant outputs in the form of floc and biofilm aggregates. This data compression has the benefit of suppressing or reducing some of the nonlinear response features, simplifying the construction of the emulator. Some of the most interesting properties at the mesoscale, such as the size, shape, and structure of biofilms and flocs, are characterised; see Figure 1. For instance, we characterise floc size using an equivalent diameter. This strategy enables us to treat a floc as a sphere or a fractal, depending on its shape, and to approximate the diameter of a sphere that circumscribes its boundary or outline.
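The circumscribing-sphere characterisation of floc size can be sketched as follows. The synthetic cell coordinates and the centroid-based approximation of the enclosing sphere are illustrative assumptions (an exact minimum enclosing sphere would need a dedicated algorithm such as Welzl's):

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical floc: 3-D coordinates of 250 cells forming one aggregate
cells = rng.normal(scale=2.0, size=(250, 3))

# equivalent diameter as the diameter of a sphere circumscribing the
# aggregate's outline, approximated as twice the largest distance from
# the centroid (a cheap proxy for the exact minimum enclosing sphere)
centroid = cells.mean(axis=0)
d_equiv = 2.0 * np.linalg.norm(cells - centroid, axis=1).max()
print(f"equivalent diameter: {d_equiv:.2f}")
```

By the triangle inequality, this centroid-based diameter is never smaller than the largest cell-to-cell distance, so it gives a conservative single-number summary of floc extent for the emulator's aggregated outputs.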